Assessing individual agreement.

Authors

  • Huiman X Barnhart
  • Andrzej S Kosinski
  • Michael J Haber
Abstract

Evaluating agreement between measurement methods or between observers is important in method comparison studies and in reliability studies. Often we are interested in whether a new method can replace an existing invasive or expensive method, or whether multiple methods or multiple observers can be used interchangeably. Ideally, interchangeability is established only if individual measurements from different methods are similar to replicated measurements from the same method. This is the concept of individual equivalence. Interchangeability between methods is similar to bioequivalence between drugs in bioequivalence studies. Following the FDA guidelines on individual bioequivalence, we propose to assess individual agreement among multiple methods via individual equivalence using the moment criteria. In the case where there is a reference method, we extend the individual bioequivalence criteria to individual equivalence criteria and propose to use individual equivalence coefficient (IEC) to compare multiple methods to one or multiple references. In the case where there is no reference method available, we propose a new IEC to assess individual agreement between multiple methods. Furthermore, we propose a coefficient of individual agreement (CIA) that links the IEC with two recent agreement indices. A method of moments is used for estimation, where one can utilize output from ANOVA models. The nonparametric and bootstrap approaches are used for inference. Five examples are used for illustration.
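As a rough illustration of the moment-based recipe outlined in the abstract (within-method versus between-method mean squared deviations estimated by the method of moments, with a nonparametric bootstrap for inference), the sketch below works through a two-method case with replicates. The `cia_no_reference` helper, the toy data, and the particular ratio of within- to between-method MSD are illustrative assumptions only, not the paper's exact IEC/CIA definitions.

```python
# Hypothetical sketch: a coefficient-of-individual-agreement style index for two
# methods with replicated measurements, estimated by the method of moments, with a
# subject-level nonparametric bootstrap interval. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def msd_between(y1, y2):
    """Mean squared deviation between readings of two methods.
    y1, y2: arrays of shape (n_subjects, n_replicates)."""
    diffs = y1[:, :, None] - y2[:, None, :]   # all cross-method replicate pairings
    return np.mean(diffs ** 2)

def msd_within(y):
    """Mean squared deviation between distinct replicates of the same method."""
    n_rep = y.shape[1]
    diffs = y[:, :, None] - y[:, None, :]
    mask = ~np.eye(n_rep, dtype=bool)          # drop a replicate paired with itself
    return np.mean(diffs[:, mask] ** 2)

def cia_no_reference(y1, y2):
    """Ratio of average within-method MSD to between-method MSD
    (values near 1 indicate good individual agreement in this formulation)."""
    intra = 0.5 * (msd_within(y1) + msd_within(y2))
    return intra / msd_between(y1, y2)

# Toy data: n subjects, 2 replicates per method, shared subject effect.
n = 50
subject = rng.normal(100, 15, size=(n, 1))
y1 = subject + rng.normal(0, 5, size=(n, 2))        # method 1 replicates
y2 = subject + 2 + rng.normal(0, 6, size=(n, 2))    # method 2: small bias, more noise

est = cia_no_reference(y1, y2)

# Nonparametric bootstrap over subjects for an interval estimate.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    boot.append(cia_no_reference(y1[idx], y2[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"CIA-style estimate: {est:.3f}  (95% bootstrap CI: {lo:.3f}, {hi:.3f})")
```

A ratio near 1 says that disagreement between the two methods is no larger than the disagreement between replicates of the same method, which is the intuition behind individual equivalence described above.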

Similar articles

Comparison of concordance correlation coefficient and coefficient of individual agreement in assessing agreement.

In method comparison and reliability studies, it is often important to assess agreement between multiple measurements made by different methods, devices, laboratories, observers, or instruments. For continuous data, the concordance correlation coefficient (CCC) is a popular index for assessing agreement between multiple methods on the same subject where none of the methods is treated as referen...

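For orientation only: the concordance correlation coefficient mentioned above has a simple closed form (Lin's CCC). The helper name and toy data in this minimal sketch are illustrative and unrelated to the cited study.

```python
# Minimal sketch of Lin's concordance correlation coefficient (CCC) for two methods
# measured once per subject; variable names and data are illustrative only.
import numpy as np

def ccc(x, y):
    """Lin's CCC: 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

rng = np.random.default_rng(1)
truth = rng.normal(50, 10, 200)
method_a = truth + rng.normal(0, 3, 200)
method_b = truth + 1.5 + rng.normal(0, 3, 200)   # systematic bias lowers the CCC
print(f"CCC = {ccc(method_a, method_b):.3f}")
```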

Interobserver Agreement in Assessing Dysplasia in Colorectal Adenomatous Polyps: A Multicentric Iranian Study

Background & Objective: Most colorectal cancers (CRCs) arise from adenomatous polyps, and clinical management of this type of polyp is highly dependent on the reliability and validity of the pathological diagnosis. The aim of this study was to examine the interobserver agreement of five pathologists in assessing dysplasia in adenomatous polyps. Methods: I...


Assessing quality of volunteer crowdsourcing contributions: lessons from the Cropland Capture game

Volunteered Geographical Information (VGI) is the assembly of spatial information based on public input. While VGI has proliferated in recent years, assessing the quality of volunteer-contributed data has proven challenging, leading some to question the efficiency of such programs. In this paper, we compare several quality metrics for individual volunteers’ contributions. The data was the produc...


Agreement and Coverage of Indicators of Response to Intervention: a Multi-method Comparison and Simulation.

PURPOSE Agreement across methods for identifying students as inadequate responders or as learning disabled is often poor. We report (1) an empirical examination of final status (post-intervention benchmarks) and dual-discrepancy growth methods based on growth during the intervention and final status for assessing response to intervention; and (2) a statistical simulation of psychometric issues ...


Focused on Categorization of a Continuum

Agreement on unitizing, where several annotators freely put units of various sizes and categories on a continuum, is difficult to assess because of the simultaneous discrepancies in positioning and categorizing. The recent agreement measure γ offers an overall solution that simultaneously takes into account positions and categories. In this article, I propose the additional coefficient γcat, w...



Journal:
  • Journal of Biopharmaceutical Statistics

Volume 17, Issue 4

Pages: -

Year of publication: 2007